Pi-Tamagotchi

ECE 5725 Final Project
By Sana Chawla (sc2347) and Lulu Htutt (lh543).


Demonstration Video


Introduction

Tamagotchi is a handheld virtual pet simulation game released in the 1990s, in which the user must take care of a pet by feeding and training it. Our project, Pi-Tamagotchi, is inspired by this nostalgic system. We keep the original objective of taking care of virtual pets and keeping them happy, healthy, and fed, while also incorporating a facial recognition component to foster a greater sense of camaraderie and connection. We used a Raspberry Pi 4, a PiTFT, and a Pi Camera to create a device reminiscent of the Tamagotchi, with OpenCV and Pygame for the system software.



Project Objective:

  • Create an interactive system where the user has to take care of a virtual creature.
  • Use facial recognition to make Pi-Tamagotchi more interactive and personal.
  • Use a SQL database system to maintain game state.
  • Foster a sense of camaraderie by maintaining a user's history of creatures.

Design

Frontend

The Pi-Tamagotchi user interface was implemented using Pygame and displayed on the PiTFT. We centered the system around five main screens: Login, Files, Main, Action, and Death. On every menu screen, the user navigated between options using the buttons, and the current choice was indicated by a '>' to the left of the menu item.
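The cursor-based menu navigation described above can be sketched as a small helper class; this is an illustrative reconstruction, not the exact code from game.py, and the rendering itself (drawing each line with pygame.font on the PiTFT) is omitted.

```python
class Menu:
    """Tracks the current selection in a list of menu options.

    On screen, the selected option is drawn with a '>' cursor to its
    left; the up/down buttons move the cursor, wrapping at the ends.
    """

    def __init__(self, options):
        self.options = options
        self.index = 0

    def up(self):
        self.index = (self.index - 1) % len(self.options)

    def down(self):
        self.index = (self.index + 1) % len(self.options)

    def render_lines(self):
        # Prefix the current choice with '>'; pad the rest for alignment.
        return [("> " if i == self.index else "  ") + opt
                for i, opt in enumerate(self.options)]

    def select(self):
        return self.options[self.index]
```

The same class can back every menu in the game (login options, file slots, action list), since only the option strings change.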

The login screen had three options: New Player, Returning Player, and Quit. Choosing either of the first two options opened the camera, which counted down from 5 before taking the user's photo.


Login Design

After logging in as a new or returning user, the user is redirected to a screen showing all of their available tamagotchis. For new users, all ten slots are empty; for returning users, the slots contain their saved tamagotchis and data. Each "file" slot shows the tamagotchi's ID and name, as well as an icon. The user can navigate through these slots and select the one they want to interact with. If an EMPTY slot is selected, a new, random tamagotchi is born and initialized.


File Selection Design

Once the user chooses a tamagotchi, they are directed to the main screen, which shows the tamagotchi's image; its health, hunger, and happiness bars; the in-game time; and a menu with Menu, Back, and Quit options. Selecting Menu leads to the actions the user can perform; selecting Back returns to the file selection screen.
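Drawing the status bars reduces to mapping a 0-100 stat onto a filled rectangle width; a minimal sketch follows, where the bar width and the clamping range are our illustrative assumptions rather than values from the actual code.

```python
BAR_WIDTH = 60  # pixels; hypothetical bar size for the 320x240 PiTFT


def clamp(value, lo=0, hi=100):
    """Keep a stat inside its valid 0-100 range."""
    return max(lo, min(hi, value))


def bar_fill_px(stat):
    """Width in pixels of the filled portion of a status bar.

    In the game this width would be passed to pygame.draw.rect to
    draw the filled part of the health/hunger/happiness bar.
    """
    return int(BAR_WIDTH * clamp(stat) / 100)
```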


Main Screen Design

To perform actions, the user selects the Menu option, which leads to another menu with the options Info, Feed, Interact, Clean, Back, and Quit. Selecting Info displays information about the tamagotchi, including its ID, name, age, health, hunger, and happiness. Feed starts a mini-game in which the user tries to catch watermelon falling from the top of the screen. Interact opens the camera so the user can take a picture. Because the tamagotchi poops periodically, it is the user's responsibility to clean up after it: selecting Clean lets the user tap the screen to clean up the poop.
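The core update step of the watermelon-catching mini-game can be sketched as below; the catch tolerance, fall speed, and list-of-positions representation are illustrative assumptions, not the actual game's values.

```python
SCREEN_W, SCREEN_H = 320, 240   # PiTFT resolution
CATCH_RANGE = 20                # hypothetical catch tolerance in pixels


def step(fruits, basket_x, speed=4):
    """Advance falling watermelons by one frame.

    fruits is a list of [x, y] positions. Returns (remaining, caught),
    where caught counts fruits that reached the bottom of the screen
    within CATCH_RANGE pixels of the basket's x position.
    """
    remaining, caught = [], 0
    for x, y in fruits:
        y += speed
        if y >= SCREEN_H:
            if abs(x - basket_x) <= CATCH_RANGE:
                caught += 1  # landed in the basket
            # otherwise the fruit is missed and falls off screen
        else:
            remaining.append([x, y])
    return remaining, caught
```

Each caught watermelon would then reduce the tamagotchi's hunger stat before the screen is redrawn.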


Actions Designs

Backend

In the backend, we used Pygame, OpenCV, and SQLite. We created the game database with three tables: Users, Tamagotchis, and Relations. The Users table stores information such as the UID and the image path used for facial recognition. The Tamagotchis table stores every possible tamagotchi, with information such as its ID, name, and image. The Relations table keeps track of which users have which tamagotchis, along with the specific statistics (age, health, hunger, happiness) of those tamagotchis. We set the primary key to the pair (UID, TID) because each entry corresponds to a unique user and its corresponding tamagotchi.
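The three-table layout with the composite (UID, TID) primary key can be sketched in SQLite as follows; column names here are illustrative rather than the exact ones in tamagotchi.db.

```python
import sqlite3

# In-memory sketch of the game database schema described above.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE Users (
    uid        INTEGER PRIMARY KEY,
    name       TEXT,
    image_path TEXT              -- photo used for facial recognition
);
CREATE TABLE Tamagotchis (
    tid   INTEGER PRIMARY KEY,
    name  TEXT,
    image TEXT
);
CREATE TABLE Relations (
    uid       INTEGER REFERENCES Users(uid),
    tid       INTEGER REFERENCES Tamagotchis(tid),
    age       INTEGER DEFAULT 0,
    health    INTEGER DEFAULT 100,
    hunger    INTEGER DEFAULT 0,
    happiness INTEGER DEFAULT 100,
    PRIMARY KEY (uid, tid)       -- one row per (user, tamagotchi) pair
);
""")
```

The composite primary key means SQLite itself rejects a second row for the same user/tamagotchi pairing, so the game state for each pair stays unique.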

For the facial recognition, we used the face_recognition library (built on dlib) together with OpenCV. Using the Pi Camera, the user takes a photo of themselves. For new users, this photo is saved in the user_images folder and an entry is inserted into the database. Returning users, or users who choose to interact with their tamagotchi, need to be recognized: after the photo is taken, we use the face_recognition library's face_encodings function to extract the features of the face, compute the Euclidean distance between this encoding and each stored user's encoding, and finally select the user with the smallest distance using argmin.
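The matching step reduces to a nearest-neighbor lookup over encodings. A minimal sketch with NumPy follows; in the real pipeline the encodings are the 128-dimensional vectors returned by face_recognition.face_encodings(), while here they are plain arrays so the logic is shown in isolation.

```python
import numpy as np


def match_user(known_encodings, known_uids, new_encoding):
    """Return the UID whose stored face encoding is closest.

    Computes the Euclidean distance from the new encoding to each
    stored encoding, then picks the smallest with argmin.
    """
    dists = np.linalg.norm(np.asarray(known_encodings) - new_encoding, axis=1)
    return known_uids[int(np.argmin(dists))]
```

A production version would also threshold the minimum distance so that a completely unknown face is rejected rather than matched to the nearest user.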

Pygame was used to tie all of the features together. We used it to take the user's interactions with the interface and update the screen accordingly. This included pulling data from the database, updating it when feeding/cleaning, turning on the camera when needed, and handling menu navigation.

An extra feature we wanted to add was emotion detection, where the tamagotchi would play a sound or become happier/sadder depending on the user's emotion. We trained an emotion detection model to ~76% accuracy on our local machines. However, we ran into many issues installing the TensorFlow and Keras libraries on the Raspberry Pi: the TensorFlow build available for the Pi's OS architecture required Python 3.7, whereas we had Python 3.9. One solution we found was to convert the model to a TFLite model, since we were able to install TFLite on the Raspberry Pi successfully. Unfortunately, after many attempts, we could not convert the model. The code and model can be found in the GitHub repository.

Hardware

We used the PiTFT to show the game screen, the Pi Camera to take photos, and four GPIO pins on the Raspberry Pi. The four pins, GPIO 17, 22, 23, and 27, were mapped to the up, down, select, and quit functions, respectively. Because the original Tamagotchi system was handheld, we created a cardboard shell that looks like a house: the user can open a "door" to reveal the PiTFT screen, similar to having a pet at home.
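The pin-to-function mapping can be expressed as a small dispatch table; this sketch keeps the dispatch logic hardware-free (the `game` object here is a hypothetical stand-in), while on the Pi each pin would be configured as an input with RPi.GPIO and wired to this handler via a falling-edge event callback.

```python
# Button mapping from the hardware description above.
# On the Pi: GPIO.setmode(GPIO.BCM), then for each pin
# GPIO.setup(pin, GPIO.IN, pull_up_down=GPIO.PUD_UP) and
# GPIO.add_event_detect(pin, GPIO.FALLING, callback=..., bouncetime=200).
PIN_ACTIONS = {17: "up", 22: "down", 23: "select", 27: "quit"}


def handle_press(pin, game):
    """Dispatch a button press to the matching game method, if any."""
    action = PIN_ACTIONS.get(pin)
    if action is not None:
        getattr(game, action)()  # e.g. game.up(), game.quit()
    return action
```

Keeping the mapping in one dict makes it easy to re-wire buttons or add a debug keyboard fallback without touching the game logic.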


Drawings

We illustrated our Tamagotchis, status bars, and tombstone using pixel art.

Tamagotchis: Capz, Pacmen, Jaredz, Bob, King Julian, Jonathon, Nola, NYSEG, Epty, Penguin, and Fish.

Other Icons: Happiness, Health, Hunger, Pizza, and Death.


Testing

Our testing strategy was to isolate components/features of our code and get them working before combining them into the main game code. For facial recognition, we tested using the camera.py file and a separate testing database; once it was working, we added the code into game.py and changed the SQLite commands to interact with the main tamagotchi.db. One of our initial concerns was how accurate the model would be, but after testing on 11 people, it misclassified only one person, one time. All game functionality testing was done by running the code and checking print statements when something wasn't working as intended.


Result

During our demo, the poop-cleaning feature did not work as intended and crashed the software. However, after rebooting the Raspberry Pi, it worked again without any change to the code. Everything else worked as intended.


Work Distribution

All work on the project was done together in lab.


Lulu

lh543@cornell.edu


Sana

sc2347@cornell.edu


Parts List

Total: $60.00


References

PiCamera Documentation
Raspberry Pi GPIO Documentation
face_recognition Library
dlib and face_recognition Installation Guide
Pixel Art

Code Appendix

Code can be found on GitHub.